104 research outputs found

    Compressive Sensing for Remote Flood Monitoring

    Although wireless sensor networks (WSNs) are considered one of the prominent solutions for flood monitoring, the energy-constrained nature of the sensors remains a technical challenge. In this paper, we tackle this problem by proposing a novel energy-efficient remote flood monitoring system enabled by compressive sensing. The proposed approach compressively captures water-level data at a very low rate using (i) a random block-based sampler and (ii) a gradient-based compressive sensing approach, exploiting the variability of water-level data over time. Through extensive experiments on a real water-level dataset, we show that both the number of packet transmissions and the size of packets are significantly reduced. The results also demonstrate a significant reduction in sensing and transmission energy. Moreover, data reconstruction from the compressed samples is of high quality with negligible degradation compared to classic compression techniques, even at high compression rates.
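    As a rough illustration of the idea (not the authors' implementation), the sketch below applies a random block-based sampler to a synthetic water-level trace and recovers it by gradient descent on a smoothness-regularised least-squares objective; the block size, sampling ratio, and regulariser are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative, slowly varying water-level trace (assumption: smooth over time).
n = 256
t = np.arange(n)
x = 2.0 + 0.5 * np.sin(2 * np.pi * t / 64) + 0.05 * rng.standard_normal(n)

# Random block-based sampler: keep a few random samples inside each block.
block, m_per_block = 32, 8
rows = []
for b in range(0, n, block):
    rows.extend(b + rng.choice(block, size=m_per_block, replace=False))
Phi = np.zeros((len(rows), n))
Phi[np.arange(len(rows)), rows] = 1.0
y = Phi @ x                                   # compressed measurements (64 of 256 samples)

# Gradient-based reconstruction: minimise ||y - Phi x||^2 + lam * ||D x||^2,
# where D is a first-difference operator exploiting temporal smoothness.
D = np.eye(n) - np.eye(n, k=1)
lam, step = 5.0, 0.01
x_hat = np.zeros(n)
for _ in range(2000):
    grad = Phi.T @ (Phi @ x_hat - y) + lam * (D.T @ (D @ x_hat))
    x_hat -= step * grad

print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```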

    Robust Data Transmission Rate Allocation to Improve Energy Efficiency in 6G Networks

    The future sixth-generation (6G) network is expected to support both sensing and communications. Since sensing performance will rely heavily on the residual battery of smart devices, energy efficiency is one of the main concerns in the design of 6G. Motivated by these facts, we design an energy-efficient data transmission rate allocation approach for 6G networks. For a more realistic deployment, we assume that perfect channel state information (CSI) is not available; imperfect CSI may waste energy resources or degrade quality of service (QoS). Thus, we apply the maximum likelihood estimation (MLE) method to estimate the true channel characteristics from a given set of observations. The proposed approach is robust against unknown channel statistics and adapts the UEs' transmission rates to the channel quality, which reduces energy consumption and guarantees QoS. Both numerical analysis and simulation results confirm the effectiveness of the proposed work in terms of energy efficiency and throughput maximization.
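    A minimal sketch of the general idea, assuming an exponential (Rayleigh-fading) channel-gain model and illustrative power, bandwidth, and QoS numbers; the MLE of the mean gain and the outage-constrained rate choice below are stand-ins for the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative observations of a UE's channel power gain (assumed exponential model).
true_mean_gain = 1.5
obs = rng.exponential(true_mean_gain, size=200)

# MLE of the mean gain under an exponential model is simply the sample mean.
gain_hat = obs.mean()

# Rate allocation: pick the largest rate whose outage probability under the
# estimated channel stays below a QoS target (all thresholds are assumptions).
bandwidth_hz, noise_power, tx_power = 1e6, 1e-7, 1e-6
outage_target = 0.05
candidate_rates = np.linspace(1e5, 5e6, 50)          # bit/s

def outage_prob(rate):
    # Outage when B * log2(1 + snr) < rate, i.e. the gain falls below a threshold.
    snr_needed = 2 ** (rate / bandwidth_hz) - 1
    gain_needed = snr_needed * noise_power / tx_power
    return 1.0 - np.exp(-gain_needed / gain_hat)     # exponential-gain CDF

feasible = [r for r in candidate_rates if outage_prob(r) <= outage_target]
print("allocated rate (Mbit/s):", max(feasible) / 1e6 if feasible else 0.0)
```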

    Robust Channel Estimation in Multiuser Downlink 5G Systems Under Channel Uncertainties

    In wireless communication, network performance relies heavily on the accuracy of channel state information (CSI). On the other hand, the channel statistics are usually unknown, and measurement information is lost due to fading. Therefore, we propose a channel estimation approach for downlink communication under channel uncertainty. We apply the Tobit Kalman filter (TKF) to estimate the hidden state vectors of wireless channels. To minimize the maximum estimation error, a robust minimax mean-square error (MSE) estimation approach is developed while the QoS requirements of wireless users are taken into account. We then formulate the minimax problem as a non-cooperative game to find an optimal filter and adjust the best behavior for the worst-case channel uncertainty. We also investigate a scenario in which the actual operating point is not exactly known under model uncertainty. Finally, we investigate the existence and characterization of a saddle point as the solution of the game. Theoretical analysis verifies that our work is robust against uncertainty in the channel statistics and is able to track the true values of the channel states. Additionally, simulation results demonstrate the superiority of the model over related techniques in terms of MSE.
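    The following is a heavily simplified, scalar sketch of a Kalman filter with Tobit-style handling of censored measurements: when a reading is clipped at the threshold, it is replaced by its conditional expectation under the predicted Gaussian. The model parameters are assumed, and the paper's robust minimax filter is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Scalar AR(1) "channel state" with left-censored measurements (Tobit model):
# the receiver cannot report values below the threshold tau.  Parameters are illustrative.
a, Q, R, tau = 0.95, 0.05, 0.10, -0.5
x, x_est, P = 0.0, 0.0, 1.0
truth, estimates = [], []

for _ in range(200):
    x = a * x + rng.normal(0, np.sqrt(Q))              # true state evolution
    z = max(x + rng.normal(0, np.sqrt(R)), tau)        # censored measurement
    x_pred, P_pred = a * x_est, a * a * P + Q           # prediction step
    S = P_pred + R                                      # innovation variance
    if z > tau:                                         # uncensored: standard update
        y_used = z
    else:                                               # censored: Tobit-style pseudo-measurement
        alpha = (tau - x_pred) / np.sqrt(S)
        # E[y | y <= tau] under the predicted Gaussian (simplified TKF step).
        y_used = x_pred - np.sqrt(S) * norm.pdf(alpha) / max(norm.cdf(alpha), 1e-9)
    K = P_pred / S
    x_est = x_pred + K * (y_used - x_pred)
    P = (1 - K) * P_pred
    truth.append(x); estimates.append(x_est)

print("RMSE:", np.sqrt(np.mean((np.array(truth) - np.array(estimates)) ** 2)))
```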

    Dynamic Resource Allocation Model for Distribution Operations using SDN

    In vehicular ad-hoc networks, autonomous vehicles generate a large amount of data to support in-vehicle applications, so a platform with large storage and high computation capacity is needed. On the other hand, computation for vehicular networks at the cloud platform requires low latency. Applying edge computation (EC) as a new computing paradigm has the potential to provide computation services while reducing latency and improving the total utility. We propose a three-tier EC framework to set the elastic processing capacity and dynamically compute routes to suitable edge servers for real-time vehicle monitoring. This framework comprises the cloud computation layer, the EC layer, and the device layer. The resource allocation approach is formulated as an optimization problem, and we design a new reinforcement learning (RL) algorithm, assisted by cloud computation, to solve it. By integrating EC and software defined networking (SDN), this study provides a new software defined networking edge (SDNE) framework for resource assignment in vehicular networks. The novelty of this work is the design of a multi-agent RL-based approach using experience replay. The proposed algorithm stores the users' communication information and the network tracks' state in real time. Simulation results with various system factors are presented to demonstrate the efficiency of the suggested framework, including a real-world case study.
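    As a toy illustration of reinforcement learning with experience replay for server assignment (not the SDNE framework itself), the sketch below trains a tabular Q-learning agent that routes vehicle tasks to one of three edge servers; the states, rewards, and dynamics are assumptions.

```python
import random
from collections import deque, defaultdict

random.seed(3)

# Toy setting: the state is the coarse load level of 3 edge servers, and the
# action assigns the next vehicle task to one server.  Rewards are illustrative.
ACTIONS = range(3)
loads = [0, 0, 0]

def step(action):
    global loads
    loads[action] += 1
    reward = -loads[action]                       # penalise queuing delay on the chosen server
    loads = [max(l - 1, 0) for l in loads]        # each server drains one task per slot
    return tuple(loads), reward

Q = defaultdict(float)
replay = deque(maxlen=500)                         # experience replay buffer
alpha, gamma, eps = 0.1, 0.9, 0.2
state = tuple(loads)

for episode in range(2000):
    action = random.choice(list(ACTIONS)) if random.random() < eps else \
             max(ACTIONS, key=lambda a: Q[(state, a)])
    next_state, reward = step(action)
    replay.append((state, action, reward, next_state))
    # Learn from a random minibatch of stored experience (the replay step).
    for s, a, r, s2 in random.sample(list(replay), k=min(16, len(replay))):
        target = r + gamma * max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (target - Q[(s, a)])
    state = next_state

print("greedy action in the empty-load state:", max(ACTIONS, key=lambda a: Q[((0, 0, 0), a)]))
```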

    An accurate RSS/AoA-based localization method for internet of underwater things

    Localization is an important issue for the Internet of Underwater Things (IoUT), since the performance of a large number of underwater applications relies heavily on the position information of underwater sensors. In this paper, we propose a hybrid localization approach based on angle-of-arrival (AoA) and received signal strength (RSS) for IoUT. We consider a smart fishing scenario in which, using the proposed approach, fishers can locate fish effectively. The proposed method collects the RSS observations and estimates the AoA based on the error variance. For a more realistic deployment, we assume that perfect noise information is not available. Thus, a minimax approach is provided in order to optimize the worst-case performance and enhance the estimation accuracy under the unknown parameters. Furthermore, we analyze the mismatch of the proposed estimator using the mean-square error (MSE). We then develop a semidefinite programming (SDP)-based method that relaxes the non-convex constraints into convex constraints to solve the localization problem efficiently. Finally, the Cramer-Rao lower bounds (CRLBs) are derived to bound the performance of the RSS-based estimator. In comparison with other localization schemes, the proposed method increases localization accuracy by more than 13%. Our method can localize 96% of the sensor nodes with less than 5% positioning error when 25% of the nodes are anchors.
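    A small sketch of hybrid RSS/AoA positioning, solved here with noise-weighted nonlinear least squares rather than the paper's SDP relaxation; the path-loss exponent, anchor geometry, and noise levels are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)

# Anchors with known 2-D positions and one unknown node (illustrative geometry, metres).
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_pos = np.array([35.0, 60.0])

# RSS model: P_r = P0 - 10*n*log10(d/d0) + noise;  AoA: bearing from each anchor + noise.
P0, n_pl, d0 = -40.0, 3.0, 1.0
d = np.linalg.norm(anchors - true_pos, axis=1)
rss = P0 - 10 * n_pl * np.log10(d / d0) + rng.normal(0, 1.0, len(anchors))
aoa = np.arctan2(true_pos[1] - anchors[:, 1], true_pos[0] - anchors[:, 0]) \
      + rng.normal(0, np.deg2rad(2.0), len(anchors))

def residuals(p):
    dist = np.linalg.norm(anchors - p, axis=1)
    r_rss = rss - (P0 - 10 * n_pl * np.log10(dist / d0))
    bearing = np.arctan2(p[1] - anchors[:, 1], p[0] - anchors[:, 0])
    r_aoa = np.angle(np.exp(1j * (aoa - bearing)))          # wrap angle differences
    return np.concatenate([r_rss / 1.0, r_aoa / np.deg2rad(2.0)])  # weight by noise std

est = least_squares(residuals, x0=np.array([50.0, 50.0])).x
print("estimate:", est, "error (m):", np.linalg.norm(est - true_pos))
```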

    Hybridisation of genetic algorithm with simulated annealing for vertical-handover in heterogeneous wireless networks

    To provide seamless mobility in heterogeneous wireless networks, two significant methods, simulated annealing (SA) and genetic algorithms (GAs), are hybridised. In this paradigm, vertical handovers (VHs) are necessary for seamless mobility. The hybrid algorithm is able to find the optimal network to connect to, with a good quality of service (QoS), in accordance with the user's preferences. The intelligent algorithm was developed to provide solutions in near real time and to avoid slow, heavy computations, in keeping with the limited capabilities of mobile devices. Moreover, a cost function is used to sustain the chosen QoS during transitions between networks, measured in terms of the bandwidth, BER, ABR, SNR and monetary cost. Simulation results show that applying the SA rules minimises the cost function and that the GA-SA algorithm reduces the number of unnecessary handovers, thereby avoiding the 'ping-pong' effect.
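    A toy sketch of the GA-SA hybrid: chromosomes encode a sequence of network choices, fitness combines a weighted QoS cost with a handover (ping-pong) penalty, and mutated offspring are accepted under a simulated-annealing rule; the candidate networks, weights, and cooling schedule are assumptions.

```python
import math
import random

random.seed(5)

# Candidate networks: (name, bandwidth Mbit/s, BER, SNR dB, monetary cost) - illustrative.
networks = [("WLAN", 54.0, 1e-4, 20.0, 1.0), ("LTE", 100.0, 1e-6, 15.0, 5.0),
            ("UMTS", 14.4, 1e-5, 12.0, 3.0), ("WiMAX", 70.0, 1e-5, 18.0, 4.0)]
w = dict(bw=0.4, ber=0.2, snr=0.2, price=0.1, switch=0.1)   # assumed user preferences
T = 10                                                       # decision epochs

def slot_cost(i):
    _, bw, ber, snr, price = networks[i]
    return w["bw"] * (1 - bw / 100) + w["ber"] * ber / 1e-4 \
         + w["snr"] * (1 - snr / 20) + w["price"] * price / 5

def fitness(chrom):
    # Total QoS cost plus a penalty per handover to discourage ping-pong behaviour.
    switches = sum(chrom[t] != chrom[t - 1] for t in range(1, T))
    return sum(slot_cost(i) for i in chrom) + w["switch"] * switches

population = [[random.randrange(len(networks)) for _ in range(T)] for _ in range(8)]
temperature = 1.0
for generation in range(100):
    new_pop = []
    for parent in population:
        mate = random.choice(population)
        cut = random.randrange(1, T)
        child = parent[:cut] + mate[cut:]                    # one-point crossover
        if random.random() < 0.3:                            # mutation
            child[random.randrange(T)] = random.randrange(len(networks))
        delta = fitness(child) - fitness(parent)
        accept = delta < 0 or random.random() < math.exp(-delta / temperature)
        new_pop.append(child if accept else parent)          # SA (Metropolis) acceptance rule
    population = new_pop
    temperature *= 0.95                                      # cooling schedule

best = min(population, key=fitness)
print("handover plan:", [networks[i][0] for i in best], "cost:", round(fitness(best), 3))
```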

    Employing Unmanned Aerial Vehicles for Improving Handoff using Cooperative Game Theory

    Heterogeneous wireless networks used for seamless mobility are expected to face prominent problems in future 5G cellular networks. Owing to their flexibility and adaptable deployment, remote-controlled Unmanned Aerial Vehicles (UAVs) can assist heterogeneous wireless communication. However, the key challenge of current UAV-assisted communications lies in providing appropriate accessibility over wireless networks via mobile devices with an acceptable Quality of Service (QoS) based on the users' preferences. To this end, we propose a novel method based on cooperative game theory to select the best UAV during the handover process and to optimize handover among UAVs by decreasing (i) the end-to-end delay, (ii) the handover latency, and (iii) the signaling overhead. Moreover, the standard design of Software Defined Network (SDN) with Media Independent Handover (MIH) is used to provide forwarding switches in order to obtain seamless mobility. Numerical results derived from real data illustrate the effectiveness of the proposed approach in terms of the number of handovers, cost, and delay.
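    As one standard cooperative-game construct (not necessarily the paper's exact formulation), the sketch below computes Shapley values for a set of candidate UAVs whose coalition value is the handover demand they can jointly cover, and picks the UAV with the largest marginal contribution; the capacities and demand are assumed.

```python
from itertools import permutations

# Illustrative cooperative game: a coalition of UAVs is worth the handover demand
# it can jointly cover, capped at the total demand.  All numbers are assumptions.
uavs = ["UAV1", "UAV2", "UAV3", "UAV4"]
capacity = {"UAV1": 30, "UAV2": 50, "UAV3": 20, "UAV4": 40}
demand = 100

def value(coalition):
    return min(sum(capacity[u] for u in coalition), demand)

def shapley(player):
    # Average marginal contribution of `player` over all join orders.
    total, count = 0.0, 0
    for order in permutations(uavs):
        before = set(order[:order.index(player)])
        total += value(before | {player}) - value(before)
        count += 1
    return total / count

scores = {u: shapley(u) for u in uavs}
print(scores)
print("handover target with the largest marginal contribution:", max(scores, key=scores.get))
```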

    Sustainable Edge Node Computing Deployments in Distributed Manufacturing Systems

    The advancement of mobile internet technology has created opportunities for integrating the Industrial Internet of Things (IIoT) and edge computing in smart manufacturing. These sustainable technologies enable intelligent devices to achieve high-performance computing with minimal latency. However, the intricate interactions among network sensors, equipment, service levels, and network topologies in smart manufacturing systems pose challenges to node deployment. This paper therefore introduces a novel, low-cost approach to deploying edge computing nodes in smart manufacturing environments: the proposed sustainable game-theoretic method identifies the optimal edge computing node to deploy in order to attain the desired outcome. Additionally, the standard design of Software Defined Network (SDN) in conjunction with edge computing provides forwarding switches that enhance overall computing services. Simulations demonstrate the effectiveness of this approach in reducing network delay and the deployment costs associated with computing resources. Given the significance of sustainability, cost efficiency plays a critical role in establishing resilient edge networks. Our numerical and simulation results validate that the proposed scheme surpasses existing techniques such as shortest estimated latency first (SELF), shortest estimated buffer first (SEBF), and random deployment (RD) in minimizing the total cost of deploying edge nodes, network delay, packet loss, and energy consumption.
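    A small stand-in for the node-deployment decision (not the paper's game-theoretic method): an exhaustive search over candidate edge-node sites that trades off deployment cost against sensor-to-node delay; the positions, costs, and weights are illustrative assumptions.

```python
import itertools
import math

# Illustrative plant layout: sensor/equipment positions and candidate edge-node sites.
sensors = [(1, 1), (2, 5), (6, 2), (7, 7), (3, 8), (9, 4)]
sites = [(2, 2), (6, 6), (8, 3), (4, 7)]
deploy_cost = 10.0           # cost per deployed node (assumed)
delay_weight = 2.0           # cost per unit of sensor-to-node distance (assumed)

def total_cost(chosen):
    if not chosen:
        return math.inf
    # Each sensor is served by its nearest deployed node.
    delay = sum(min(math.dist(s, c) for c in chosen) for s in sensors)
    return deploy_cost * len(chosen) + delay_weight * delay

# Exhaustive search over site subsets (fine for a handful of candidate sites).
best = min((subset for r in range(1, len(sites) + 1)
            for subset in itertools.combinations(sites, r)), key=total_cost)
print("deploy edge nodes at:", best, "total cost:", round(total_cost(best), 2))
```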

    Skin Cancer Diagnosis Based on Neutrosophic Features with a Deep Neural Network.

    Recent years have seen an increase in the total number of skin cancer cases, and the number is projected to grow exponentially. This paper proposes a computer-aided diagnosis system for the classification of malignant lesions, in which the acquired image is first pre-processed using novel methods. Digital artifacts such as hair follicles and blood vessels are removed, and the image is then enhanced using a novel histogram equalization method. The pre-processed image then undergoes the segmentation phase, where the suspected lesion is segmented using the neutrosophic technique. The segmentation method employs a thresholding-based approach along with a pentagonal neutrosophic structure to form a segmentation mask of the suspected skin lesion. The paper proposes a deep neural network based on Inception and residual blocks, with a softmax block after each residual block, which makes the layers wider and allows the key features to be learned more quickly. The proposed classifier was trained, tested, and validated on the PH2, ISIC 2017, ISIC 2018, and ISIC 2019 datasets. The proposed segmentation model yields accuracy scores of 99.50%, 99.33%, 98.56%, and 98.04% for these datasets, respectively. These datasets are augmented to form a total of 103,554 training images, which enables the classifier to produce enhanced classification results. Our experimental results confirm that the proposed classifier yields accuracy scores of 99.50%, 99.33%, 98.56%, and 98.04% for PH2, ISIC 2017, 2018, and 2019, respectively, which is better than most pre-existing classifiers.
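    A rough PyTorch sketch of the kind of building block the abstract describes, a residual block followed by a channel-wise softmax; the layer sizes are illustrative and this is not the authors' exact Inception-residual architecture.

```python
import torch
import torch.nn as nn

class ResidualSoftmaxBlock(nn.Module):
    """Residual block followed by a channel-wise softmax (illustrative sizes)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.relu(out + x)                  # residual (skip) connection
        return torch.softmax(out, dim=1)          # softmax block over channels

x = torch.randn(2, 32, 64, 64)                    # (batch, channels, H, W)
print(ResidualSoftmaxBlock(32)(x).shape)
```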